
    Increasing the solubility and stability of various proteins through rational design and directed evolution

    For the biophysical characterization or structure determination of a protein, it must be produced recombinantly in soluble form and in large quantities. However, the intrinsic instability of many eukaryotic proteins often makes this very difficult or entirely impossible. Although this problem can in some cases be circumvented by varying the expression and/or purification conditions, in many cases the amino acid sequence must be optimized by mutagenesis. If the structure of a protein is known, or if a reliable model is available, this optimization can be carried out by rational design. In the first project of this thesis, the solubility of the N-terminal domain of human proprotein convertase subtilisin/kexin type 9 was significantly increased on the basis of several homology models by replacing 15 hydrophobic amino acids on the protein surface with more hydrophilic residues: the yield of purified protein rose to five times the initial value, and the ammonium sulfate concentration required to precipitate 50 % of the protein increased from 0.23 M to 0.52 M. However, the optimized protein variant was somewhat less stable against chemical denaturation: the urea concentration required to unfold 50 % of the protein decreased from 3.75 M to 3.02 M. These results show that the solubility and conformational stability of a protein are not necessarily correlated. In the absence of structural information, the amino acid sequence of a protein can be optimized by directed evolution. Here, a large repertoire of protein variants is first generated by random mutagenesis, from which those with the best properties can subsequently be isolated by efficient selection or screening procedures. 
In the central second part of this thesis, directed evolution was used to increase the solubility and stability of the ligand-binding domain of the human glucocorticoid receptor (hGR-LBD), with fused green fluorescent protein (GFP) serving as a reporter. GFP produces an easily measured fluorescence signal whose intensity depends on the solubility and stability of the fused hGR-LBD and which can be read out in a high-throughput screening procedure by fluorescence-activated cell sorting (FACS). First, two plasmid-encoded hgr-lbd-egfp gene libraries with high and low mutation frequency, respectively, were generated by random mutagenesis using error-prone PCR. E. coli cells transformed with the two libraries and pooled were then subjected to a total of eight FACS enrichment rounds, sorting stepwise for higher GFP fluorescence at high throughput (> 10,000 cells/second). This yielded a total of five hGR-LBD variants (four from enrichment round 5 and one from enrichment round 8), all of which originated from the high-mutation-frequency library and together contained 30 different amino acid substitutions. Of these 30 substitutions, 14 were tested individually, identifying four mutations which, similar to the mutation F602S described in the literature (Bledsoe et al., 2002), increase the fluorescence of the fused GFP and the solubility of hGR-LBD in the absence of GFP. Combining the beneficial mutations showed that the effects of the individual substitutions on fluorescence and solubility are additive and that the two parameters are linearly related. The increased solubility of the optimized hGR-LBD variant allowed its purification in high yield and the investigation of its conformational stability by thermal unfolding. 
All substitutions were found to raise the apparent melting temperature both with the bound agonist dexamethasone and (with one exception) with the bound antagonist mifepristone, and the fluorescence of fused GFP was linearly related to the thermal stability of hGR-LBD in the absence of GFP. Introducing the beneficial mutations identified in this work increases the purification yield of the protein from the soluble cell fraction of E. coli 26-fold. In addition, the substitutions raise the apparent melting temperature of both the agonist and the antagonist conformation by about 8 °C. During the FACS-based high-throughput screening, several artifacts were observed that hampered the isolation of optimized hGR-LBD variants. The causes of these artifacts, which can arise at both the gene and the protein level, were elucidated, and strategies for identifying and avoiding them were developed. Three of the stabilizing mutations made it possible, in collaboration with F. Hoffmann-La Roche (Basel), to solve the X-ray crystal structure of the homologous GR-LBD from mouse (mGR-LBD; 95.3 % sequence identity with hGR-LBD) at an improved resolution of 1.5 Å instead of the 1.9 Å achieved previously. Analysis of the structure revealed previously unknown atomic details and explained the effect of the stabilizing mutations. In future work, the optimized GR-LBD variants can now be used for structure determination of the protein with new lead-compound ligands, which may be useful in the development of pharmaceutical agents. 
In the third part of this thesis, a project begun in my diploma thesis on stabilizing the artificial (βα)8-barrel protein HisF-C*C, which consists of two identical, fused C-terminal halves of tmHisF (the synthase subunit of imidazole glycerol phosphate synthase from Thermotoga maritima), was brought to a preliminary conclusion with the determination of a high-resolution X-ray crystal structure. This project resulted in two publications (Seitz et al., 2007; Höcker et al., 2009), which are included in the appendix to this thesis.

    Geometry identification and data enhancement for distributed flow measurements

    The measurement of fluid motion is an important tool for researchers in fluid dynamics. Measurements of increasing precision have expedited the development of fluid-dynamic models and their theoretical understanding. Several well-established experimental techniques provide point-wise information on the flow field. In recent years, novel measurement modalities have been investigated which deliver spatially resolved three-dimensional velocity measurements. Note that for methods such as particle tracking and tomographic particle imaging, optical access to the flow domain is necessary. For other methods, like magnetic resonance velocimetry, CT angiography, or X-ray velocimetry, this is not the case. This property, together with the fact that these methods can provide three-dimensional velocity fields in a rather short acquisition time, makes them particularly suited for in-vivo applications. Our work is motivated by such non-invasive velocity measurement techniques, for which no optical access to the interior of the geometry is needed, and in many cases none is available. Here, an additional difficulty is that the exact flow geometry is in general not known a priori. The measurement techniques we are interested in are extensions of already available medical imaging modalities. As a prototypical example, we consider magnetic resonance velocimetry, which is also suited for the measurement of turbulent fluid motion. We will also discuss computational examples using such measurement data. General purpose. Our main goal is a suitable post-processing of the available velocity data and also to obtain additional information. The measurements available from magnetic resonance velocimetry consist of several components given on a fixed field of view. The magnitude of the MR signal corresponds to a proton density and thus, e.g., to the density of water molecules. These data typically give a clear indication of the position and size of the flow geometry. 
The velocity data, on the other hand, are substantially perturbed outside the flow domain. This is a typical feature of measurements stemming from magnetic resonance velocimetry. Note that the surrounding noise usually has a notably higher magnitude than the actual measurements. Thus, a first necessary step is to separate the domain containing valuable velocity data from the noise surrounding it. For this purpose, we apply a form of image segmentation in which we make use of the given density image. Since the velocity values are given on the same field of view, the segmentation transfers directly to those data. Due to the measurement procedure, the segmented velocity data are also contaminated by measurement errors. Therefore, besides segmentation, additional post-processing is necessary in order to make the flow measurements available for further use. In a second step, we propose a problem-adapted data enhancement method which provides a smoothed velocity field on the one hand, and additional information on the other, such as the pressure drop or an estimate of the wall shear stress. The two main steps are therefore: (i) the identification of the flow geometry, where we make use of the available density measurements; (ii) the denoising and improvement of the segmented velocity data, using a suitable fluid-dynamical model. Outline. In part I of this thesis, we introduce our basic approach to the geometry identification and velocity enhancement problems described above. Both problems are formulated as optimal control problems governed by a partial differential equation, and we briefly discuss some general aspects of the analysis and solution of such problems in section 4. In part II, we thoroughly discuss and analyze the geometry identification problem introduced in section 2. The procedure is formulated as an ill-posed inverse problem, and we propose a Tikhonov regularization for its stable solution. 
We show that the resulting optimal control problem has a solution and discuss its numerical treatment with iterative methods. Finally, a systematic discretization can be realized using finite elements, which is also demonstrated by numerical tests. The velocity enhancement problem is introduced in part III. We propose a linearized flow model which directly incorporates the available measurements. The resulting modeling error can be quantified in terms of the data error. The reconstruction method is then formulated as an optimal control problem subject to the linearized equations. We show the existence of a unique solution and derive estimates for the reconstruction error. Additionally, a reconstruction of the pressure is obtained, for which we derive similar error estimates. We discuss the systematic discretization using finite elements and show preliminary computational examples for the verification of the derived estimates. In order to verify the applicability of the proposed methods to realistic data, we consider an application using experimental data in part IV. We use measurements of a human blood vessel stemming from magnetic resonance velocimetry, obtained at the University Medical Center in Freiburg. After a suitable pre-processing of the available data, we apply the geometry identification method in order to obtain a discretization of the blood vessel. Using the generated mesh, we reconstruct an enhanced velocity field and the pressure from the available velocity data.
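The role of Tikhonov regularization for such an ill-posed inverse problem can be sketched in a toy setting. The following is a minimal illustration only, not the thesis' PDE-constrained formulation: the forward operator A (a smoothing kernel), the "true" parameter, and the noise level are all invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50
# Severely ill-conditioned forward operator (a Gaussian smoothing kernel),
# standing in for the ill-posed map from geometry/parameter to data.
t = np.linspace(0, 1, n)
A = np.exp(-80.0 * (t[:, None] - t[None, :]) ** 2)

x_true = np.sin(2 * np.pi * t)                   # "true" parameter
y = A @ x_true + 0.01 * rng.standard_normal(n)   # noisy measurements

def tikhonov(A, y, alpha):
    """Minimize ||A x - y||^2 + alpha * ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ y)

x_naive = np.linalg.lstsq(A, y, rcond=None)[0]   # unregularized: noise amplified
x_reg = tikhonov(A, y, alpha=1e-3)               # regularized: stable

err_naive = np.linalg.norm(x_naive - x_true)
err_reg = np.linalg.norm(x_reg - x_true)
```

The regularization term penalizes the wildly oscillating components that the naive least-squares solution picks up from the noise, which is exactly the stabilization effect the Tikhonov approach provides in the infinite-dimensional setting.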

    Supporting users in password authentication with persuasive design

    Activities like text editing, watching movies, or managing personal finances are all accomplished with web-based solutions nowadays. The providers need to ensure the security and privacy of user data. To that end, passwords are still the most common authentication method on the web. They are inexpensive and easy to implement. Users are largely accustomed to this kind of authentication, but passwords represent a considerable nuisance because they are tedious to create, remember, and maintain. In many cases, usability issues turn into security problems, because users try to work around the challenges and create easily predictable credentials. Often, they reuse their passwords for many purposes, which aggravates the risk of identity theft. There have been numerous attempts to remove the root of the problem and replace passwords, e.g., through biometrics. However, no other authentication strategy can fully replace them, so passwords will probably stay a go-to authentication method for the foreseeable future. Researchers and practitioners have thus aimed to improve users' situation in various ways. There are two main lines of research on helping users create both usable and secure passwords. On the one hand, password policies have a notable impact on password practices, because they enforce certain characteristics. However, enforcement reduces users' autonomy and often causes frustration if the requirements are poorly communicated or overly complex. On the other hand, user-centered designs have been proposed: assistance and persuasion are typically more user-friendly, but their influence is often limited. In this thesis, we explore potential reasons for the inefficacy of certain persuasion strategies. From the gained knowledge, we derive novel persuasive design elements to support users in password authentication. The exploration of contextual factors in password practices is based on four projects that reveal both psychological aspects and real-world constraints. 
Here, we investigate how mental models of password strength and password managers can provide important pointers towards the design of persuasive interventions. Moreover, the associations between personality traits and password practices are evaluated in three user studies. A meticulous audit of real-world password policies shows the constraints for selection and reuse practices. Based on the review of context factors, we then extend the design space of persuasive password support with three projects. We first depict the explicit and implicit user needs in password support. Second, we craft and evaluate a choice architecture that illustrates how a phenomenon from marketing psychology can provide new insights into the design of nudging strategies. Third, we try to empower users to create memorable passwords with emojis. The results show the challenges and potentials of emoji passwords on different platforms. Finally, the thesis presents a framework for the persuasive design of password support. It aims to structure the required activities during the entire process. This enables researchers and practitioners to craft novel systems that go beyond traditional paradigms, which is illustrated by a design exercise.

    On the contribution of the electromagnetic dipole operator ${\cal O}_7$ to the $\bar B_s \to \mu^+\mu^-$ decay amplitude

    We construct a factorization theorem that allows one to systematically include QCD corrections to the contribution of the electromagnetic dipole operator in the effective weak Hamiltonian to the $\bar B_s \to \mu^+\mu^-$ decay amplitude. We first rederive the known result for the leading-order QED box diagram, which features a double-logarithmic enhancement associated with the different rapidities of the light quark in the $\bar B_s$ meson and the energetic muons in the final state. We provide a detailed analysis of the cancellation of the related endpoint divergences appearing in individual momentum regions, and show how the rapidity logarithms can be isolated by suitable subtractions applied to the corresponding bare factorization theorem. This allows us to include in a straightforward manner the QCD corrections arising from the renormalization-group running of the hard matching coefficient of the electromagnetic dipole operator in soft-collinear effective theory, the hard-collinear scattering kernel, and the $B_s$-meson distribution amplitude. Focusing on the contribution from the double endpoint logarithms, we derive a compact formula that resums the leading-logarithmic QCD corrections. Comment: 33 pages, 3 figures

    Improvement of Machine Learning Models for Time Series Forecasting in Radial-Axial Ring Rolling through Transfer Learning

    Due to increasing computing power and corresponding algorithms, the use of machine learning (ML) in production technology has risen sharply in the age of Industry 4.0. Data availability in particular is fundamental here and a prerequisite for the successful implementation of an ML application. If the quantity or quality of data is insufficient for a given problem, techniques such as data augmentation, the use of synthetic data, and transfer learning from similar data sets can provide a remedy. In this paper, the concept of transfer learning is applied in the field of radial-axial ring rolling (rarr) and implemented using the example of time-series prediction of the outer diameter over the process time. Radial-axial ring rolling is a hot forming process used for seamless ring production.
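The transfer-learning idea described above can be sketched with a deliberately simple toy model. This is an illustration only, not the paper's actual model or data: the AR(1) dynamics, the 0.90/0.88 coefficients, and the sample sizes are invented, and the "fine-tuning" is a ridge-style pull of the target fit toward the coefficient pretrained on the similar source process.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_ar1(phi, n, noise=0.05):
    """Simulate a toy AR(1) time series x[t] = phi * x[t-1] + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + noise * rng.standard_normal()
    return x

source = make_ar1(0.90, 2000)   # plentiful data from a similar process
target = make_ar1(0.88, 12)     # scarce data from the process of interest

def fit_ar1(x, prior=None, alpha=0.0):
    """Least-squares AR(1) fit; with alpha > 0 the coefficient is pulled
    toward `prior`, i.e. fine-tuning instead of training from scratch."""
    X, y = x[:-1], x[1:]
    num = X @ y + (alpha * prior if prior is not None else 0.0)
    den = X @ X + alpha
    return num / den

phi_pre = fit_ar1(source)                              # pretrain on source
phi_scratch = fit_ar1(target)                          # small-data fit: high variance
phi_tuned = fit_ar1(target, prior=phi_pre, alpha=5.0)  # transfer: warm start
```

The fine-tuned coefficient is a weighted average of the scratch estimate and the pretrained one, so scarce target data is stabilized by the source process, which is the essence of transferring between similar data sets.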

    Extending the Touchscreen Pattern Lock Mechanism with Duplicated and Temporal Codes

    We investigate improvements to authentication on mobile touchscreen phones and present a novel extension to the widely used touchscreen pattern lock mechanism. Our solution allows including nodes in the grid multiple times, which enhances the resilience to smudge attacks and other forms of attack. For example, for a smudge pattern covering 7 nodes, our approach increases the number of possible lock patterns by a factor of 15. Our concept was implemented and evaluated in a laboratory user test (n = 36). The test participants found the usability of the proposed concept to be equal to that of the baseline pattern lock mechanism but considered it more secure. Our solution is fully backwards-compatible with the current baseline pattern lock mechanism, hence enabling easy adoption whilst providing higher security at a comparable level of usability.
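The growth of the pattern space from allowing node repetition can be illustrated with a small count. Note this is only a combinatorial sketch: it ignores the real Android adjacency/pass-through rules and the paper's exact scheme, so the ratios below are illustrative upper bounds, not the factor of 15 reported above.

```python
from math import perm

def patterns_no_repeat(length, nodes=9):
    """Ordered node sequences on a 3x3 grid without revisiting a node."""
    return perm(nodes, length)

def patterns_with_repeat(length, nodes=9):
    """Ordered node sequences when every node may be revisited freely."""
    return nodes ** length

# How much does free repetition enlarge the space for patterns of length 4 and 7?
ratios = {k: patterns_with_repeat(k) / patterns_no_repeat(k) for k in (4, 7)}
```

Even in this crude model the gap widens quickly with pattern length, which is why duplicated nodes make a smudge trace map to many more candidate patterns.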

    UWB Channel Impulse Responses for Positioning in Complex Environments: A Detailed Feature Analysis

    Radio signal-based positioning in environments with complex propagation paths is a challenging task for classical positioning methods. For example, in a typical industrial environment, objects such as machines and workpieces cause reflections, diffractions, and absorptions, which are not taken into account by classical lateration methods and may lead to erroneous positions. Only a few data-driven methods developed in recent years can deal with these irregularities in the propagation paths or use them as additional information for positioning. These methods exploit the channel impulse responses (CIR) that are detected by ultra-wideband radio systems for positioning. These CIRs embed the signal properties of the underlying propagation paths that represent the environment. This article describes a feature-based localization approach that exploits machine learning to derive characteristic information from the CIR signal for positioning. The approach works entirely without highly time-synchronized receivers or arrival-time measurements. Various features were investigated based on signal propagation models for complex environments. These features were then assessed qualitatively based on their spatial relationship to objects and their contribution to a more accurate position estimation. Three datasets collected in environments of varying degrees of complexity were analyzed. The evaluation of the experiments showed a clear relationship between the features and the environment, indicating that features in complex propagation environments improve positional accuracy. A quantitative assessment of the features was made based on a hierarchical classification of stratified regions within the environment. Classification accuracies of over 90% could be achieved for region sizes of about 0.1 m². An application-driven evaluation was made to distinguish between different screwing processes on a car door based on CIR measurements. 
While nearly error-free classification could be achieved in a static environment, even with a single infrastructure tag, accuracy decreases rapidly when the environment changes. To adapt to changes in the environment, the models were retrained with a small amount of CIR data, which increased performance considerably. The proposed approach results in highly accurate classification, even with a reduced infrastructure of one or two tags, and is easily adaptable to new environments. In addition, the approach does not require calibration or synchronization of the positioning system or the installation of a reference system.
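Hand-crafted CIR features of the kind such a feature-based approach relies on can be sketched as follows. The feature set (energy, peak amplitude, first-path index, RMS delay spread) and the detection threshold are illustrative assumptions, not the article's actual definitions, and the two-path CIR is synthetic.

```python
import numpy as np

def cir_features(h, threshold_ratio=0.2):
    """Extract simple scalar features from sampled CIR magnitudes `h`."""
    h = np.abs(np.asarray(h, dtype=float))
    energy = float(np.sum(h ** 2))
    peak_amp = float(h.max())
    # First path: first tap exceeding a fraction of the peak amplitude.
    first_path = int(np.argmax(h >= threshold_ratio * peak_amp))
    # RMS delay spread: spread of the normalized power profile around its mean delay.
    p = h ** 2 / energy
    taps = np.arange(len(h))
    mean_delay = float(np.sum(taps * p))
    rms_spread = float(np.sqrt(np.sum(((taps - mean_delay) ** 2) * p)))
    return {"energy": energy, "peak_amp": peak_amp,
            "first_path": first_path, "rms_delay_spread": rms_spread}

# Toy two-path CIR: a weaker direct path at tap 3, a stronger reflection at tap 8.
toy_cir = np.zeros(32)
toy_cir[3], toy_cir[8] = 0.5, 1.0
feats = cir_features(toy_cir)
```

Feature vectors like this, computed per received CIR, would then feed the classifier that assigns measurements to regions; the spread and first-path features are exactly the quantities that multipath-rich environments imprint on the CIR.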

    Identification Of Investigation Procedures To Predict Work Roll Fatigue For Developing Machine Learning Applications – A Systematic Literature Review

    Machine learning approaches present significant opportunities for optimizing existing machines and production systems. In hot rolling processes in particular, great potential for optimization can be exploited. Radial-axial ring rolling is a crucial process used to manufacture seamless rings. However, the failure of the mandrel represents a defect within the ring rolling process that currently cannot be adequately explained. Mandrel failure is unpredictable, occurs without a directly identifiable reason, and can appear several times a week depending on the ring rolling mill and capacity utilization. Broken rolls lead to unscheduled production downtimes and defective rings, and can damage other machine parts. Considering the extensive recording of production data in ring rolling, the implementation of machine learning models for the prediction of such roll breaks offers great potential. To present a comprehensive overview of the potential influencing factors that may be relevant to the lifetime of mandrels, a systematic literature review (SLR) focusing on work roll wear in hot rolling processes is conducted. Based on the results of the SLR, a first selection of features and the investigation procedures used are presented. These insights can be used for the prediction of mandrel failure with machine learning models in further work.

    Passquerade: Improving Error Correction of Text Passwords on Mobile Devices by using Graphic Filters for Password Masking

    Entering text passwords on mobile devices is a significant challenge. Current systems either display passwords in plain text, making them visible to bystanders, or replace characters with asterisks shortly after they are typed, making them harder to edit. This work presents a novel approach to masking text passwords by distorting them with graphical filters. Distorted passwords are difficult for attackers to observe because they cannot mentally reverse the distortions, yet the passwords remain readable to their owners because humans can recognize visually distorted versions of content they have seen before. We present the results of an online questionnaire and a user study in which we compared Color-halftone, Crystallize, Blurring, and Mosaic filters to plain text and asterisks when (1) entering, (2) editing, and (3) shoulder surfing one-word passwords, random character passwords, and passphrases. Rigorous analysis shows that the Color-halftone and Crystallize filters significantly improve editing speed, editing accuracy, and observation resistance compared to current approaches.
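One of the filters named above, Mosaic, can be sketched as block-averaging pixelation over the rendered password image. This is a minimal stand-in, not the study's implementation: the block size is an invented parameter and the random array merely stands in for an actual rendered-text bitmap.

```python
import numpy as np

def mosaic(img, block=4):
    """Mosaic (pixelation) filter: replace each block x block tile of a
    grayscale image by its mean intensity."""
    h, w = img.shape
    assert h % block == 0 and w % block == 0, "toy version: exact tiling only"
    # Group pixels into (tile_row, row_in_tile, tile_col, col_in_tile).
    tiles = img.reshape(h // block, block, w // block, block)
    means = tiles.mean(axis=(1, 3))  # one intensity value per tile
    # Blow each tile mean back up to full resolution.
    return np.repeat(np.repeat(means, block, axis=0), block, axis=1)

rng = np.random.default_rng(2)
img = rng.random((16, 32))   # stand-in for a rendered password bitmap
pix = mosaic(img, block=4)
```

The filter discards high-frequency detail (each tile becomes flat) while preserving overall intensity, which captures the intuition that fine character shapes are hidden from observers while coarse, recognizable structure survives for the owner.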

    Early evaluation of corneal collagen crosslinking in ex-vivo human corneas using two-photon imaging

    The clinical outcome of corneal collagen crosslinking (CXL) is typically evaluated several weeks after treatment. An earlier assessment of its outcome could lead to an optimization of the treatment, including an immediate re-intervention in case of failure, thereby avoiding additional discomfort and pain for the patient. In this study, we propose two-photon imaging (TPI) as an earlier evaluation method. CXL was performed in human corneas by application of riboflavin followed by UVA irradiation. Autofluorescence (AF) intensity and lifetime images were acquired using a commercial, clinically certified multiphoton tomograph prior to CXL and after 2 h, 24 h, 72 h, and 144 h of storage in culture medium. The first monitoring point was determined as the minimum time required for riboflavin clearance from the cornea. As controls, untreated samples and samples treated only with riboflavin (without UVA irradiation) were monitored at the same time points. Significant increases in stromal AF intensity and lifetime were observed as soon as 2 h after treatment. A depth-dependent TPI analysis showed higher AF lifetimes anteriorly, corresponding to areas where CXL was most effective. No alterations were observed in the control groups. Using TPI, the outcome of CXL can be assessed non-invasively and label-free much sooner than with conventional clinical devices. Funding: European Union Horizon 2020 (LASER-HISTO); European project FLIMVERTIC.